Hierarchical Orthogonal Factorization: Sparse Square Matrices

Authors

Abstract

In this work, we develop a new fast algorithm, spaQR -- sparsified QR, for solving large, sparse linear systems. The key to our approach is using low-rank approximations to sparsify the separators in a Nested Dissection based Householder QR factorization. First, a modified version of Nested Dissection is used to identify the interiors/separators and reorder the matrix. Then, a classical Householder QR is used to factorize the interiors, going from the leaves to the root of the elimination tree. After every level of interior factorization, the remaining separators are sparsified by low-rank approximations. This operation reduces the size of the separators without introducing any fill-in. However, it introduces a small approximation error which can be controlled by the user. The resulting approximate factorization is stored as a sequence of orthogonal and upper-triangular factors. Hence, it can be applied efficiently to solve linear systems. Additionally, we further improve the algorithm with a block diagonal scaling. We show a systematic analysis of the effectiveness of the algorithm. Finally, we perform numerical tests on benchmark unsymmetric problems to evaluate the performance of the algorithm. The factorization time scales as $\mathcal{O}(N \log N)$ and the solve time scales as $\mathcal{O}(N)$.
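As a hedged illustration of the two ingredients named above, the short NumPy sketch below eliminates an interior block with Householder QR and then compresses a separator coupling block with a truncated SVD. The block sizes, the synthetic low-rank structure, and the tolerance are assumptions made for the demo; this is not the actual spaQR implementation or its hierarchical elimination tree.

```python
import numpy as np

# Minimal sketch of two building blocks described in the abstract (assumptions
# noted below): (1) Householder QR elimination of an "interior" block, and
# (2) a truncated SVD as a stand-in for low-rank sparsification of a separator
# coupling block.

rng = np.random.default_rng(0)
n_i, n_s = 80, 20                                   # interior / separator sizes (assumed)
A = rng.standard_normal((n_i + n_s, n_i + n_s))
# Assume the separator columns are numerically low rank (typical for separators
# of discretized PDEs); here that compressibility is simply built in.
A[:, n_i:] = (rng.standard_normal((n_i + n_s, 5)) @ rng.standard_normal((5, n_s))
              + 1e-9 * rng.standard_normal((n_i + n_s, n_s)))

# (1) Eliminate the interior columns: np.linalg.qr uses Householder reflections,
#     so Q.T @ A leaves the interior columns upper triangular.
Q, _ = np.linalg.qr(A[:, :n_i], mode="complete")
A = Q.T @ A

# (2) Compress the separator coupling block with a truncated SVD; spaQR uses such
#     low-rank approximations to shrink separators without creating fill-in, at
#     the cost of an approximation error controlled by `tol`.
C = A[:n_i, n_i:]
U, s, Vt = np.linalg.svd(C, full_matrices=False)
tol = 1e-6                                          # user-chosen truncation tolerance
r = int(np.sum(s > tol * s[0]))
C_lr = (U[:, :r] * s[:r]) @ Vt[:r, :]
print(f"coupling block rank: {r} of {min(C.shape)}; "
      f"relative error: {np.linalg.norm(C - C_lr) / np.linalg.norm(C):.1e}")
```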


Similar Articles

Sparse Gaussian Elimination and Orthogonal Factorization

We consider the solution of a linear system Ax = b on a distributed-memory machine when the matrix A has full rank and is large, sparse and nonsymmetric. We use our Cartesian nested dissection algorithm to compute a fill-reducing column ordering of the matrix. We develop algorithms that use the associated separator tree to estimate the structure of the factor and to distribute and perform numeric com...
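Since this snippet centers on a fill-reducing column ordering, here is a hedged SciPy sketch of that mechanic. Reverse Cuthill-McKee is used only as a stand-in for the paper's Cartesian nested dissection (which SciPy does not provide), and the scrambled 2D Laplacian test matrix is an assumption for the demo.

```python
import numpy as np
import scipy.sparse as sp
from scipy.sparse.csgraph import reverse_cuthill_mckee

# Illustrative only: for sparse QR, fill appears in the factor R, whose sparsity
# is governed by the graph of A^T A, so a fill-reducing *column* ordering of A
# is an ordering of that graph.  RCM stands in for nested dissection here.

g = 20                                             # grid size (assumed)
T = sp.diags([-1.0, 2.0, -1.0], [-1, 0, 1], shape=(g, g))
L = sp.kron(sp.identity(g), T) + sp.kron(T, sp.identity(g))

rng = np.random.default_rng(7)
scramble = rng.permutation(L.shape[1])
A = L.tocsc()[:, scramble]                         # a "badly ordered" sparse matrix

def bandwidth(M):
    C = sp.coo_matrix(M)
    return int(np.abs(C.row - C.col).max())

G = (A.T @ A).tocsr()                              # column-intersection graph
perm = reverse_cuthill_mckee(G, symmetric_mode=True)
A_perm = A[:, perm]                                # apply the column ordering

print("bandwidth of A^T A before ordering:", bandwidth(G))
print("bandwidth of A^T A after  ordering:", bandwidth(A_perm.T @ A_perm))
```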

Sparse block factorization of saddle point matrices

The factorization method presented in this paper takes advantage of the special structures and properties of saddle point matrices. A variant of Gaussian elimination equivalent to the Cholesky factorization is suggested and implemented for factorizing the saddle point matrices block-wise with small blocks of order 1 and 2. The Gaussian elimination applied to these small blocks on block level ...
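To make the exploited block structure concrete, here is a small NumPy sketch of the coarse 2x2 block factorization of a saddle point matrix via its Schur complement; the paper's method works with pivot blocks of order 1 and 2, while this sketch and its random test blocks are only an illustrative assumption.

```python
import numpy as np

# Hedged sketch: block factorization of K = [[A, B^T], [B, 0]] using the Schur
# complement S = -B A^{-1} B^T.  The random SPD block A and full-row-rank B are
# assumptions for the demo.

rng = np.random.default_rng(2)
n, m = 8, 3
M0 = rng.standard_normal((n, n))
A = M0 @ M0.T + n * np.eye(n)                  # symmetric positive definite block
B = rng.standard_normal((m, n))                # constraint block, full row rank

K = np.block([[A, B.T], [B, np.zeros((m, m))]])

Ainv_BT = np.linalg.solve(A, B.T)              # A^{-1} B^T
S = -B @ Ainv_BT                               # Schur complement (negative definite)

# K = L D U with unit block-triangular L, U and block-diagonal D (A symmetric,
# so B A^{-1} = (A^{-1} B^T)^T).
L = np.block([[np.eye(n), np.zeros((n, m))], [Ainv_BT.T, np.eye(m)]])
D = np.block([[A, np.zeros((n, m))], [np.zeros((m, n)), S]])
U = np.block([[np.eye(n), Ainv_BT], [np.zeros((m, n)), np.eye(m)]])

print("reconstruction error:", np.linalg.norm(K - L @ D @ U))
```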

Fast hierarchical solvers for sparse matrices

Sparse linear systems are ubiquitous in various scientific computing applications. Inversion of sparse matrices with standard direct solve schemes is prohibitive for large systems due to their quadratic/cubic complexity. Iterative solvers, on the other hand, demonstrate better scalability. However, they suffer from poor convergence rates when used without a preconditioner. There are many preco...
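To make the preconditioning point concrete, the hedged sketch below solves the same sparse system with GMRES, once without and once with an incomplete-LU preconditioner. The 2D Laplacian test problem and the ILU parameters are assumptions for the demo, not the hierarchical solver discussed in the paper.

```python
import numpy as np
import scipy.sparse as sp
import scipy.sparse.linalg as spla

# Hedged demo: unpreconditioned vs. ILU-preconditioned GMRES on a standard
# 5-point 2D Laplacian (assumed test problem).

n = 40                                            # 40 x 40 grid -> 1600 unknowns
T = sp.diags([-1.0, 2.0, -1.0], [-1, 0, 1], shape=(n, n))
A = (sp.kron(sp.identity(n), T) + sp.kron(T, sp.identity(n))).tocsc()
b = np.ones(A.shape[0])

def run(M=None, label=""):
    residuals = []                                # one entry per inner iteration
    x, info = spla.gmres(A, b, M=M, maxiter=500,
                         callback=residuals.append, callback_type="pr_norm")
    print(f"{label}: iterations={len(residuals)}, "
          f"residual={np.linalg.norm(b - A @ x):.2e}")

run(label="GMRES without preconditioner")

ilu = spla.spilu(A, drop_tol=1e-4, fill_factor=10)
M = spla.LinearOperator(A.shape, matvec=ilu.solve)
run(M=M, label="GMRES with ILU preconditioner")
```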

Orthogonal Connectivity Factorization: Interpretable Decomposition of Variability in Correlation Matrices

In many multivariate time series, the correlation structure is nonstationary, that is, it changes over time. The correlation structure may also change as a function of other cofactors, for example, the identity of the subject in biomedical data. A fundamental approach for the analysis of such data is to estimate the correlation structure (connectivities) separately in short time windows or for ...
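As a minimal illustration of that first step, the sketch below estimates one correlation matrix per short time window of a synthetic multivariate series; the signal, window length, and use of np.corrcoef are assumptions for the demo, not the paper's decomposition method.

```python
import numpy as np

# Per-window correlation matrices of a multivariate time series (toy example).

rng = np.random.default_rng(3)
T, d, win = 1000, 5, 100                    # samples, channels, window length (assumed)
X = rng.standard_normal((T, d))
X[:, 1] += 0.8 * X[:, 0]                    # inject one correlated channel pair

windows = [X[t:t + win] for t in range(0, T, win)]
corrs = np.stack([np.corrcoef(w, rowvar=False) for w in windows])

print(corrs.shape)                          # (n_windows, d, d)
print("corr(ch0, ch1) per window:", np.round(corrs[:, 0, 1], 2))
```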

Square ice, alternating sign matrices and classical orthogonal polynomials

The six-vertex model with Domain Wall Boundary Conditions, or square ice, is considered for particular values of its parameters, corresponding to 1-, 2-, and 3-enumerations of Alternating Sign Matrices (ASMs). Using Hankel determinant representations for the partition function and the boundary correlator of homogeneous square ice, it is shown how the ordinary and refined enumerations can be der...
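For context, the ordinary (1-)enumeration mentioned here is the classical ASM counting formula $A(n) = \prod_{k=0}^{n-1} \frac{(3k+1)!}{(n+k)!}$, which gives $1, 2, 7, 42, 429, \ldots$ for $n = 1, 2, 3, 4, 5$.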

Journal

Journal title: SIAM Journal on Matrix Analysis and Applications

Year: 2022

ISSN: 1095-7162, 0895-4798

DOI: https://doi.org/10.1137/20m1373475